Towards Semantics-Enhanced Pre-Training: Can Lexicon Definitions Help Learning Sentence Meanings?


Abstract

Self-supervised pre-training techniques, albeit relying on large amounts of text, have enabled rapid growth in learning language representations for natural language understanding. However, as radically empirical models of sentences, they are subject to the input data distribution, inevitably incorporating data bias and reporting bias, which may lead to inaccurate understanding of sentences. To address this problem, we propose to adopt a human learner's approach: when we cannot make sense of a word in a sentence, we often consult a dictionary for its specific meaning; but can the same work for models? In this work, we try to inform the pre-trained masked language model of word meanings for semantics-enhanced pre-training. To achieve a contrastive and holistic view of word meanings, a definition pair of two related words is presented to the model such that it can better associate a word with its crucial semantic features. Both intrinsic and extrinsic evaluations validate the proposed approach on semantics-orientated tasks, with an almost negligible increase in training data.
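The abstract's core idea — presenting the definitions of two related words together, with their headwords masked, so the model must recover each word from its semantic features — can be sketched as follows. This is a minimal illustrative reconstruction: the function name, the `[MASK]`/`[SEP]` token conventions, and the exact pairing format are assumptions, not the paper's actual preprocessing pipeline.

```python
# Hedged sketch of building a contrastive definition-pair example for
# masked-LM pre-training. The headword of each definition is replaced by
# a mask token; the gold labels are the two headwords, in order.

def build_definition_pair(word_a, def_a, word_b, def_b,
                          mask_token="[MASK]", sep_token="[SEP]"):
    """Pair two related words' dictionary definitions, masking each
    headword so the model must infer it from the definition text."""
    masked_a = f"{mask_token}: {def_a}"
    masked_b = f"{mask_token}: {def_b}"
    text = f"{masked_a} {sep_token} {masked_b}"
    labels = [word_a, word_b]  # gold fillers for the two masks
    return text, labels

# Example with two semantically related words (illustrative definitions):
text, labels = build_definition_pair(
    "bank", "a financial institution that accepts deposits",
    "vault", "a secure room where money or valuables are kept",
)
```

Presenting the two definitions jointly, rather than one at a time, is what gives the model the contrastive signal: it must distinguish the two masked headwords by their differing semantic features.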


Similar Articles

Towards an Inquiry-Based Language Learning: Can a Wiki Help?

Wiki use may help EFL instructors to create an effective learning environment for inquiry-based language teaching and learning. The purpose of this study was to investigate the effects of wikis on the EFL learners’ IBL process. Forty-nine EFL students participated in the study while they conducted research projects in English. The Non-wiki group (n = 25) received traditional inquiry instr...


From Sentence Meanings to Full Semantics

(1) Two approaches to meanings (2) The lifting lemma (3) When is the lifting an extension? (4) Applications to artificial languages (5) Applications to natural languages (6) References. 1. Two approaches to meanings. A. Aristotle (4th c. BC): Separate words have meanings. The meaning of a sentence is the result of combining the meanings of the words in it. The meaning of a sentence is the embo...


Why Does Unsupervised Pre-training Help Deep Learning?

Much recent research has been devoted to learning algorithms for deep architectures such as Deep Belief Networks and stacks of auto-encoder variants, with impressive results obtained in several areas, mostly on vision and language data sets. The best results obtained on supervised learning tasks involve an unsupervised learning component, usually in an unsupervised pre-training phase. Even thou...


Towards The Semantics Of Sentence Adverbials

In the present paper we argue that the so-called sentence adverbials (typically, adverbs like probably, admittedly,...) should be generated, in the framework of Functional Generative Description, by means of a special deep case Complementation of Attitude (CA) on grounds of their special behaviour in the topic-focus articulation (TFA) of a sentence. From the viewpoint of the translation of CA e...


A Rudimentary Lexicon and Semantics Help Bootstrap Phoneme Acquisition

Infants spontaneously discover the relevant phonemes of their language without any direct supervision. This acquisition is puzzling because it seems to require the availability of high-level linguistic structures (lexicon, semantics) that logically presuppose that infants already have a set of phonemes. We show how this circularity can be broken by testing, in real-size language corpora, a sc...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2021

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v35i15.17619